- Short: Auto download or link-check entire Web sites! (v1.00)
- Author: Chris S Handley (cshandley-REMOVETHIS@ANDTHIS_iee.org)
- Uploader: Chris S Handley (cshandley-REMOVETHIS@ANDTHIS_iee.org)
- Type: comm/tcp
- Requires: HTTPResume v1.3+, Rexxsupport.library, ARexx
- Version: v1.00
-
- Introduction
- ------------
- Have you ever visited a cool web site & wanted to keep a copy of some/all of it,
- but it would take ages to find & download all the respective pages/files?
-
- This is the answer!
-
- You supply this ARexx script with the start page URL, a destination directory
- (which should be empty), and maybe a few other options - and off it goes!
- Note that it needs HTTPResume v1.3+ to work (get it from Aminet).
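- For example, from a Shell you might start it with something along these lines
- (the script name & exact argument order shown here are only an illustration -
- check the documentation included in the archive for the real usage):
-
-   rx WebDownload http://www.somesite.com/index.html Work:SomeSite/
-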
-
- Latest News
- -----------
- This is (probably) my final release, as I have not updated the code in two
- years and it has worked very well for me. Since it works so well, I have
- finally given it v1.00 status!
-
- Should now work on even more HTML pages due to minor improvements.
-
-
- If you cannot get ARexx to work, please read my warning below about text
- editors.
-
- History
- -------
- v1.00  (22-08-02) - Should now handle URLs with "?" in them. Now considers
-                      SWF (Flash) files as pictures. Final version?
-                      Added anti-spam stuff to email address :(
- v0.66ß (30-01-00) - Updated docs with new email address, warning about text
-                      editors causing problems, and other small changes.
- <snip>
-